Accelerated and Inexact Soft-Impute for Large-Scale Matrix and Tensor Completion

Authors

  • Quanming Yao
  • James T. Kwok
Abstract

Matrix and tensor completion aim to recover a low-rank matrix / tensor from limited observations and have been commonly used in applications such as recommender systems and multi-relational data mining. A state-of-the-art matrix completion algorithm is Soft-Impute, which exploits the special “sparse plus low-rank” structure of the matrix iterates to allow efficient SVD in each iteration. Though Soft-Impute is a proximal algorithm, it is generally believed that acceleration destroys the special structure and is thus not useful. In this paper, we show that Soft-Impute can indeed be accelerated without compromising this structure. To further reduce the iteration time complexity, we propose an approximate singular value thresholding scheme based on the power method. Theoretical analysis shows that the proposed algorithm still enjoys the fast O(1/T²) convergence rate of accelerated proximal algorithms. We further extend the proposed algorithm to tensor completion with the scaled latent nuclear norm regularizer. We show that a similar “sparse plus low-rank” structure also exists, leading to low iteration complexity and fast O(1/T²) convergence rate. Extensive experiments demonstrate that the proposed algorithm is much faster than Soft-Impute and other state-of-the-art matrix and tensor completion algorithms.
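The key computational idea is that each iterate is kept in “sparse plus low-rank” form, so products with the iterate are cheap and the soft-thresholded SVD can be approximated with a few power (subspace) iterations. A minimal numpy sketch of such a power-method-based approximate singular value thresholding is given below; the function names, fixed iteration count, and warm-start argument are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def splr_matvec(S, U, V, X):
    """(S + U @ V.T) @ X without forming the dense matrix.
    S is the sparse part (anything supporting @ and .T), U/V the low-rank factors."""
    return S @ X + U @ (V.T @ X)

def splr_rmatvec(S, U, V, X):
    """(S + U @ V.T).T @ X = S.T @ X + V @ (U.T @ X)."""
    return S.T @ X + V @ (U.T @ X)

def approx_svt(S, U, V, lam, rank, n_power_iter=10, R0=None, rng=None):
    """Approximate singular value thresholding of Z = S + U @ V.T
    via a few power iterations (an illustrative sketch, not the paper's code)."""
    rng = np.random.default_rng() if rng is None else rng
    m = S.shape[0]
    Q = rng.standard_normal((m, rank)) if R0 is None else R0   # optional warm start
    Q, _ = np.linalg.qr(Q)
    for _ in range(n_power_iter):
        # one subspace iteration: Q <- orth(Z @ (Z.T @ Q))
        Q, _ = np.linalg.qr(splr_matvec(S, U, V, splr_rmatvec(S, U, V, Q)))
    B = splr_rmatvec(S, U, V, Q).T          # B = Q.T @ Z, a small (rank x n) matrix
    Ub, s, Vbt = np.linalg.svd(B, full_matrices=False)
    keep = s > lam                          # soft-threshold the singular values
    return (Q @ Ub)[:, keep], s[keep] - lam, Vbt[keep].T
```

Because only products with the sparse part and the thin factors are needed, each call costs roughly O(nnz(S)·rank + (m+n)·rank²) rather than the cost of a full SVD.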


Similar articles

Accelerated Inexact Soft-Impute for Fast Large-Scale Matrix Completion

Algorithm 3: AIS-Impute (proposed algorithm).
Require: partially observed matrix O, parameter λ, decay parameter ν ∈ (0, 1), threshold ε;
1: [U_0, λ_0, V_0] = rank-1 SVD(P_Ω(O));
2: c = 1, ε̃_0 = ‖P_Ω(O)‖_F, X_0 = X_1 = λ_0 U_0 V_0^⊤;
3: for t = 1, 2, . . . do
4:   ε̃_t = ν^t ε̃_0; θ_t = (c − 1)/(c + 2);
5:   λ_t = ν^t (λ_0 − λ) + λ;
6:   Y_t = X_t + θ_t (X_t − X_{t−1});
7:   Z_t = Y_t + P_Ω(O − Y_t);
8:   V_{t−1} = V_{t−1} − V_t (V_t^⊤ V_{t−1}), remove zero columns;
...
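Read as a loose dense-numpy sketch (with an exact SVT in place of the paper's approximate step), the loop above looks roughly as follows; the helper name, stopping rule, and default parameters are assumptions for illustration only.

```python
import numpy as np

def ais_impute_sketch(O, mask, lam, nu=0.7, n_iters=100, tol=1e-4):
    """Dense illustrative sketch of the AIS-Impute loop listed above."""
    PO = mask * O                                   # P_Omega(O)
    U, s, Vt = np.linalg.svd(PO, full_matrices=False)
    lam0 = s[0]
    X_prev = X = lam0 * np.outer(U[:, 0], Vt[0])    # steps 1-2: rank-1 initialization
    c = 1
    for t in range(1, n_iters + 1):
        theta = (c - 1) / (c + 2)                   # step 4: momentum weight
        lam_t = nu ** t * (lam0 - lam) + lam        # step 5: continuation on lambda
        Y = X + theta * (X - X_prev)                # step 6: extrapolation
        Z = Y + mask * (O - Y)                      # step 7: "sparse plus low-rank" iterate
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)  # exact SVT here for clarity
        s = np.maximum(s - lam_t, 0)
        X_prev, X = X, (U * s) @ Vt
        c += 1
        if np.linalg.norm(X - X_prev) <= tol * max(np.linalg.norm(X_prev), 1.0):
            break
    return X
```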


Efficient tensor completion: Low-rank tensor train

This paper proposes a novel formulation of the tensor completion problem to impute missing entries of data represented by tensors. The formulation is introduced in terms of the tensor train (TT) rank, which can effectively capture global information of tensors thanks to its construction by a well-balanced matricization scheme. Two algorithms are proposed to solve the corresponding tensor completion p...


An Inexact Accelerated Proximal Gradient Method for Large Scale Linearly Constrained Convex SDP

The accelerated proximal gradient (APG) method, first proposed by Nesterov for minimizing smooth convex functions, later extended by Beck and Teboulle to composite convex objective functions, and studied in a unifying manner by Tseng, has proven to be highly efficient in solving some classes of large scale structured convex optimization (possibly nonsmooth) problems, including nuclear norm mini...


Spectral Regularization Algorithms for Learning Large Incomplete Matrices

We use convex relaxation techniques to provide a sequence of regularized low-rank solutions for large-scale matrix completion problems. Using the nuclear norm as a regularizer, we provide a simple and very efficient convex algorithm for minimizing the reconstruction error subject to a bound on the nuclear norm. Our algorithm Soft-Impute iteratively replaces the missing elements with those obtai...
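A minimal dense sketch of that iteration (fill the missing entries from the current estimate, then apply a soft-thresholded SVD) might look as follows; it omits the sparse-plus-low-rank bookkeeping that makes the actual Soft-Impute scale, and the parameter defaults are illustrative.

```python
import numpy as np

def soft_impute(O, mask, lam, n_iters=100, tol=1e-4):
    """Illustrative dense sketch of the Soft-Impute idea described above."""
    X = np.zeros_like(O)
    for _ in range(n_iters):
        Z = mask * O + (1 - mask) * X          # observed entries from data, missing from X
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        s = np.maximum(s - lam, 0)             # soft-threshold the singular values
        X_new = (U * s) @ Vt
        if np.linalg.norm(X_new - X) <= tol * max(np.linalg.norm(X), 1.0):
            return X_new
        X = X_new
    return X
```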


Tensor Completion

The purpose of this thesis is to explore methods to solve the tensor completion problem. Inspired by the matrix completion problem, the tensor completion problem is formulated as an unconstrained nonlinear optimization problem that finds three factors giving a low-rank approximation. Various iterative methods, including gradient-based methods, the stochastic gradient descent method ...
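For a three-factor (CP-style) formulation like the one described, the masked reconstruction loss and its factor gradients fit in a few lines of numpy; the shapes and einsum-based gradients below are an illustrative sketch under that assumption, not the thesis code.

```python
import numpy as np

def cp_reconstruct(A, B, C):
    """Rank-R reconstruction: T_hat[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r]."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def masked_loss_and_grads(T, mask, A, B, C):
    """Squared error on observed entries and its gradients w.r.t. the three factors."""
    R = mask * (cp_reconstruct(A, B, C) - T)       # residual on observed entries only
    loss = 0.5 * np.sum(R ** 2)
    gA = np.einsum('ijk,jr,kr->ir', R, B, C)
    gB = np.einsum('ijk,ir,kr->jr', R, A, C)
    gC = np.einsum('ijk,ir,jr->kr', R, A, B)
    return loss, gA, gB, gC
```

A plain gradient-based method would then repeatedly update each factor, e.g. A -= eta * gA, with eta a step size.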




Journal:
  • CoRR

Volume: abs/1703.05487   Issue: –

Pages: –

Publication date: 2017